Information, Divergence and Risk for Binary Experiments
Authors: Mark D. Reid, Robert C. Williamson
Abstract
We unify f-divergences, Bregman divergences, surrogate loss bounds (regret bounds), proper scoring rules, matching losses, cost curves, ROC curves and information. We do this by systematically studying integral and variational representations of these objects, and in so doing identify their primitives, all of which are related to cost-sensitive binary classification. As well as clarifying relationships between generative and discriminative views of learning, the new machinery leads to tight and more general surrogate loss bounds and generalised Pinsker inequalities relating f-divergences to variational divergence. The new viewpoint illuminates existing algorithms: it provides a new derivation of Support Vector Machines in terms of divergences and relates Maximum Mean Discrepancy to Fisher Linear Discriminants. It also suggests new techniques for estimating f-divergences.
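As background for the objects named in the abstract, the following are the standard textbook definitions of an f-divergence, the variational (total variation) divergence, and the classical Pinsker inequality that the paper's generalised Pinsker inequalities extend. These are not taken from the paper's own notation, and normalisation conventions for the variational divergence differ across the literature.

% Standard background definitions (textbook forms, not the paper's notation).
% For a convex f : (0, \infty) \to \mathbb{R} with f(1) = 0 and P absolutely continuous w.r.t. Q:
\[
  I_f(P, Q) \;=\; \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right) \mathrm{d}Q .
\]
% The Kullback--Leibler divergence is the special case f(t) = t \log t:
\[
  \mathrm{KL}(P \,\|\, Q) \;=\; \int \log\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right) \mathrm{d}P .
\]
% The variational (total variation) divergence, here normalised to lie in [0, 1]:
\[
  V(P, Q) \;=\; \sup_{A} \bigl| P(A) - Q(A) \bigr| ,
\]
% and the classical Pinsker inequality relating the two (KL measured in nats):
\[
  V(P, Q) \;\le\; \sqrt{\tfrac{1}{2}\,\mathrm{KL}(P \,\|\, Q)} .
\]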
Journal: Journal of Machine Learning Research
Volume: 12
Pages: -
Publication year: 2011